Machine Learning and the Future of Realism
The preceding three decades have seen the emergence, rise, and proliferation
of machine learning (ML). From half-recognised beginnings in perceptrons,
neural nets, and decision trees, algorithms that extract correlations (that is,
patterns) from a set of data points have broken free from their origin in
computational cognition to embrace all forms of problem solving, from voice
recognition to medical diagnosis to automated scientific research and
driverless cars, and it is now widely opined that the real industrial
revolution lies less in mobile phones and similar devices than in the
maturation and universal application of ML. Among the consequences just might
be the triumph of anti-realism over realism.
Combining Functional Data Registration and Factor Analysis
We extend the definition of functional data registration to encompass a
larger class of registered functions. In contrast to traditional registration
models, we allow for registered functions that have more than one primary
direction of variation. The proposed Bayesian hierarchical model simultaneously
registers the observed functions and estimates the two primary factors that
characterize variation in the registered functions. Each registered function is
assumed to be predominantly composed of a linear combination of these two
primary factors, and the function-specific weights for each observation are
estimated within the registration model. We show how these estimated weights
can easily be used to classify functions after registration using both
simulated data and a juggling data set.
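A minimal sketch of the model class being described, in assumed notation (the warping functions h_i, the factors f_1 and f_2, and the weights w_{i1}, w_{i2} below are illustrative labels, not taken from the paper):

```latex
% Hedged sketch: each observed function y_i is registered by a warping
% function h_i, and the registered function is assumed to be predominantly
% a linear combination of two primary factors.
\[
  y_i\bigl(h_i(t)\bigr) \;=\; w_{i1}\, f_1(t) \;+\; w_{i2}\, f_2(t) \;+\; \varepsilon_i(t),
  \qquad \varepsilon_i(t) \sim \mathcal{N}(0,\sigma^2),
\]
% where f_1, f_2 are the two primary factors and (w_{i1}, w_{i2}) are the
% function-specific weights, estimated jointly with h_i in the hierarchical
% model and usable afterwards to classify the functions.
```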
Asymptotic Properties for Methods Combining Minimum Hellinger Distance Estimates and Bayesian Nonparametric Density Estimates
In frequentist inference, minimizing the Hellinger distance between a kernel
density estimate and a parametric family produces estimators that are both
robust to outliers and statistically efficient when the parametric model is
correct. This paper seeks to extend these results to the use of nonparametric
Bayesian density estimators within disparity methods. We propose two
estimators: one replaces the kernel density estimator with the expected
posterior density from a random histogram prior; the other induces a posterior
over parameters through the posterior for the random histogram. We show that it
is possible to adapt the mathematical machinery of efficient influence
functions from semiparametric models to demonstrate that both our estimators
are efficient in the sense of achieving the Cramér-Rao lower bound. We further
demonstrate a Bernstein-von Mises result for our second estimator, indicating
that its posterior is asymptotically Gaussian. In addition, the robustness
properties of classical minimum Hellinger distance estimators continue to hold
- …
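For concreteness, here is a hedged sketch of the classical frequentist minimum Hellinger distance estimator that the abstract takes as its starting point, minimizing the squared Hellinger distance H^2(f, g) = 1 - ∫ sqrt(f(x) g(x)) dx between a parametric density f_θ and a kernel density estimate. The Bayesian variants proposed in the paper would replace the KDE below with an expected posterior density, e.g. from a random histogram prior; the normal model and optimizer choice here are illustrative assumptions, not the paper's.

```python
# Hedged sketch of a classical minimum Hellinger distance estimator (MHDE):
# fit a normal family to a KDE of the data by minimizing Hellinger distance.
import numpy as np
from scipy.stats import gaussian_kde, norm
from scipy.optimize import minimize
from scipy.integrate import trapezoid

rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(0.0, 1.0, 200), [8.0, 9.0]])  # two outliers

kde = gaussian_kde(data)  # nonparametric density estimate of the data
grid = np.linspace(data.min() - 3.0, data.max() + 3.0, 2000)
g_hat = kde(grid)

def hellinger_sq(theta):
    """Squared Hellinger distance between N(mu, sigma) and the KDE."""
    mu, log_sigma = theta  # parameterize sigma on the log scale (> 0)
    f = norm.pdf(grid, loc=mu, scale=np.exp(log_sigma))
    # H^2(f, g) = 1 - integral sqrt(f * g); approximated on the grid
    return 1.0 - trapezoid(np.sqrt(f * g_hat), grid)

fit = minimize(hellinger_sq, x0=[np.median(data), 0.0], method="Nelder-Mead")
mu_hat, sigma_hat = fit.x[0], np.exp(fit.x[1])
print(f"MHDE fit: mu = {mu_hat:.3f}, sigma = {sigma_hat:.3f}")
```

Because the objective downweights regions where f_θ and the density estimate disagree, the two planted outliers barely move the fit, which is the robustness property the abstract says carries over to the Bayesian estimators.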